Similar resources
Analog Computation Via Neural Networks
We pursue a particular approach to analog computation, based on dynamical systems of the type used in neural networks research. Our systems have a fixed structure, invariant in time, corresponding to an unchanging number of “neurons”. If allowed exponential time for computation, they turn out to have unbounded power. However, under polynomial-time constraints there are limits on their capabilit...
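The systems described here update a fixed set of neurons synchronously under a time-invariant map, commonly written as x(t+1) = σ(A·x(t) + B·u(t) + c) with a saturated-linear activation. A minimal sketch in plain Python, where the 2-neuron weights are illustrative assumptions, not values from the paper:

```python
def sat(v):
    # Saturated-linear activation: identity on [0, 1], clipped outside.
    return min(1.0, max(0.0, v))

def step(x, u, A, B, c):
    # One synchronous update of a fixed-size analog network:
    # x_{t+1} = sigma(A x_t + B u_t + c), computed componentwise.
    n = len(x)
    return [sat(sum(A[i][j] * x[j] for j in range(n))
                + sum(B[i][k] * u[k] for k in range(len(u)))
                + c[i])
            for i in range(n)]

# Hypothetical 2-neuron network with one binary input line.
A = [[0.5, 0.25], [0.0, 0.5]]
B = [[1.0], [0.5]]
c = [0.0, 0.1]

x = [0.0, 0.0]
for u in ([1.0], [0.0], [1.0]):   # feed in the input word 1, 0, 1
    x = step(x, u, A, B, c)
```

Note that the structure (A, B, c and the number of neurons) never changes during a computation; only the state x evolves, which is the sense in which the model is "fixed structure, invariant in time".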
Generic Analog Neural Computation - The Epsilon Chip
An analog CMOS VLSI neural processing chip has been designed and fabricated. The device employs "pulse-stream" neural state signalling, and is capable of computing some 360 million synaptic connections per second. In addition to basic characterisation results, the performance of the chip in solving "real-world" problems is also demonstrated.
VLSI design for analog neural computation
information into the cell body, and transmit the output electrical signals through the axon. The paper describes a VLSI design methodology for the implementation of analog artificial neural networks. Analog VLSI circuit techniques offer area-efficient implementation of the functions required in a neural network, such as multiplication, summation, and the sigmoid transfer function. However, the analo...
Efficient computation via sparse coding in electrosensory neural networks.
The electric sense combines spatial aspects of vision and touch with temporal features of audition. Its accessible neural architecture shares similarities with mammalian sensory systems and allows for recordings from successive brain areas to test hypotheses about neural coding. Further, electrosensory stimuli encountered during prey capture, navigation, and communication, can be readily synthe...
Analog versus discrete neural networks
We show that neural networks with three-times continuously differentiable activation functions are capable of computing a certain family of n-bit Boolean functions with two gates, whereas networks composed of binary threshold functions require at least Ω(log n) gates. Thus, for a large class of activation functions, analog neural networks can be more powerful than discrete neural networks, ...
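The two gate models being compared can be sketched directly; the weights below are hypothetical illustrations, not the construction from the paper:

```python
import math

def threshold_gate(weights, bias, x):
    # Binary threshold (Heaviside) unit: output is a single bit in {0, 1}.
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias >= 0 else 0

def analog_gate(weights, bias, x):
    # Smooth unit with an infinitely differentiable activation (tanh here);
    # its real-valued output can carry more information to the next gate
    # than the single bit a threshold unit emits.
    return math.tanh(sum(w * xi for w, xi in zip(weights, x)) + bias)

# With unit weights, the analog gate's output varies smoothly with the
# number of ones in the input, while the threshold gate only reports on
# which side of a hyperplane the input lies.
x = [1, 0, 1, 1]
s = analog_gate([1, 1, 1, 1], 0.0, x)       # smooth function of sum(x) = 3
b = threshold_gate([1, 1, 1, 1], -2.0, x)   # fires, since 3 - 2 >= 0
```

The extra expressive power in the abstract above comes from composing such smooth gates, whose real-valued intermediate signals a second gate can decode, something a cascade of one-bit threshold gates cannot mimic without many more units.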
Journal
Journal title: Theoretical Computer Science
Year: 1994
ISSN: 0304-3975
DOI: 10.1016/0304-3975(94)90178-3